Multi-aspect Sentiment Attention Modeling for Sentiment Classification of Educational Big Data
ZHAI Guanlin1,2, YANG Yan1,2, WANG Heng1,2, DU Shengdong1,2
1. School of Information Science and Technology, Southwest Jiaotong University, Chengdu 611756
2. Key Laboratory of Cloud Computing and Intelligent Technique, Sichuan Province, Southwest Jiaotong University, Chengdu 611756
Abstract Aiming at the inefficiency and heavy workload of existing college curriculum evaluation methods, a multi-aspect sentiment attention model (multi-ASAM) is proposed. Multi-ASAM concatenates a sentence with its various aspects via neural networks and adds attention over sentiment resources. To achieve better classification results, multi-ASAM takes into account both the influence of the relationships between aspects on sentiment polarity and the contribution of sentiment resources to sentiment polarity. Experimental results show that multi-ASAM outperforms comparable methods in education and other application fields.
|
Received: 12 May 2019
|
|
Fund: Supported by the National Natural Science Foundation of China (No.61572407), the National Key Technology Research and Development Program (No.2015BAH19F02), and the Southwest Jiaotong University Key Education Reform Project (No.1802028)
Corresponding Author: YANG Yan, Ph.D., professor. Her research interests include artificial intelligence, big data analysis and mining, and ensemble learning.
|
About the authors: ZHAI Guanlin, master student. His research interests include data mining, deep learning and natural language processing. WANG Heng, master student. His research interests include deep learning and natural language processing. DU Shengdong, Ph.D., lecturer. His research interests include data mining, deep learning and big data analysis.
|
|
|
|
|
|